Gradient-based deep network pruning algorithm
WANG Zhongfeng, XU Zhiyuan, SONG Chunhe, ZHANG Hongyu, CAI Yingkai
Journal of Computer Applications    2020, 40 (5): 1253-1259.   DOI: 10.11772/j.issn.1001-9081.2019081374

Deep neural network models usually contain a large number of redundant weight parameters, so inference with a deep network model demands substantial computing resources and storage space, making such models difficult to deploy on edge and embedded devices. To address this issue, a Gradient-based Deep network Pruning (GDP) algorithm was proposed. The core idea of GDP is to use the gradient as the criterion for judging the importance of each weight: weights whose gradients are smaller than a threshold are eliminated, and the threshold itself is determined adaptively to screen the weights. After pruning, the deep network model is retrained to restore its performance. Experimental results show that on the CIFAR-10 dataset the GDP algorithm reduces computational cost by 35.3 percentage points with an accuracy loss of only 0.14 percentage points. Compared with the state-of-the-art PFEC (Pruning Filters for Efficient ConvNets) algorithm, GDP increases model accuracy by 0.13 percentage points and reduces computational cost by a further 1.1 percentage points, indicating superior performance in both compressing and accelerating deep networks.
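The pruning step described in the abstract (gradient magnitude as the importance score, an adaptively chosen threshold, weights below it zeroed out) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function name `gdp_prune`, the `target_sparsity` parameter, and the percentile-based threshold selection are all assumptions made for the example.

```python
import numpy as np

def gdp_prune(weights, grads, target_sparsity=0.35):
    """Illustrative gradient-based pruning (hypothetical API).

    Weights whose gradient magnitude falls below an adaptively
    chosen threshold are zeroed out; the surviving weights would
    then be fine-tuned to restore accuracy.
    """
    scores = np.abs(grads)
    # Adaptive threshold: the percentile of the gradient magnitudes
    # that removes roughly `target_sparsity` of the weights. This is
    # one simple way to pick the threshold from the data itself.
    threshold = np.percentile(scores, target_sparsity * 100)
    mask = scores >= threshold
    return weights * mask, mask

# Toy usage: prune 35% of 1000 random weights by gradient magnitude.
rng = np.random.default_rng(0)
w = rng.normal(size=1000)
g = rng.normal(size=1000)
pruned_w, mask = gdp_prune(w, g, target_sparsity=0.35)
sparsity = 1.0 - mask.mean()
```

In a real training loop, `grads` would come from backpropagation, and the retraining phase mentioned in the abstract would follow the masking step.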
